inference workloads Flash News List | Blockchain.News

List of Flash News about inference workloads

2025-12-28 23:38
AI Compute Demand Will Outstrip Supply, Says @gdb — 200B Token Usage Highlights LLM Workload Boom

According to @gdb, AI compute demand will continue to exceed supply because additional compute acts as a multiplier on progress toward goals, as evidenced by recent usage data. source: @gdb on X, Dec 28, 2025. He cites developer Rafael Bittencourt, who reports 100 billion tokens used in 39 days on one laptop via Codex CLI with GPT-5.2 Codex xhigh, another 68 billion on a second laptop, and around 200 billion tokens in total across three OpenAI Pro accounts. source: Rafael Bittencourt on X; @gdb on X. Bittencourt states that two months of OpenAI Pro subscriptions at US$200 each amounted to about 6 percent of what pure per-token pricing would have cost, indicating a strong incentive to run very high-throughput workloads under flat-rate access. source: Rafael Bittencourt on X. For traders, these metrics signal persistent, usage-driven demand for inference compute capacity and bandwidth, key inputs for AI infrastructure and decentralized compute markets, and a dynamic that can tighten supply-demand conditions. source: @gdb on X; Rafael Bittencourt on X.
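As a rough illustration of the cost comparison described above, the sketch below reproduces the back-of-envelope arithmetic using only the figures cited in the posts (US$200 per month for OpenAI Pro, a roughly 6 percent flat-rate share, and about 200 billion tokens). The assumption that all three Pro accounts were billed for both months is for illustration only and is not stated in the source posts.

```python
# Back-of-envelope sketch of flat-rate vs. per-token cost, based on the
# figures cited in the posts above. The subscription-month split is an
# assumption for illustration, not a figure stated in the sources.

MONTHLY_PRICE_USD = 200   # OpenAI Pro flat rate (from the post)
MONTHS = 2                # reporting window (from the post)
ACCOUNTS = 3              # assumption: all three Pro accounts billed for both months
FLAT_RATE_SHARE = 0.06    # flat spend as a share of pure per-token pricing (from the post)

# Total flat-rate spend over the window.
flat_spend = MONTHLY_PRICE_USD * MONTHS * ACCOUNTS

# If that spend is ~6% of what per-token pricing would have cost,
# the implied per-token bill is the flat spend divided by that share.
implied_per_token_bill = flat_spend / FLAT_RATE_SHARE

print(f"Flat-rate spend:        ${flat_spend:,.0f}")
print(f"Implied per-token bill: ${implied_per_token_bill:,.0f}")
```

Under these assumptions the arithmetic simply scales the flat subscription spend by the reported 6 percent ratio; it says nothing about list API prices, cached-token discounts, or the input/output token mix, which the source posts do not break out.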
